Directed Random Search for Multiple Layer Perceptron Training

Authors

  • Udo Seiffert
  • Bernd Michaelis
Abstract

Although Backpropagation (BP) is commonly used to train Multiple Layer Perceptron (MLP) neural networks, and its original algorithm has been improved significantly several times, it still suffers from drawbacks such as slow convergence, getting stuck in local minima, and being bound to constraints on the neurons' activation (transfer) functions. This paper presents the substitution of backpropagation with a random search technique enhanced by a directed component. Using several benchmark problems, a case study shows potential application fields as well as the advantages and disadvantages of both Backpropagation and the Directed Random Search (DRS).
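
The abstract does not spell out the authors' exact update rule, but a generic directed random search typically perturbs all weights at random, keeps a perturbation only if it lowers the error, and biases the next trial towards the last successful step. The Python sketch below illustrates that general scheme, not the paper's specific variant; all names and parameters (step_size, direction_gain, the 0.5 decay) are illustrative assumptions.

```python
import numpy as np

def mlp_forward(weights, X):
    """Forward pass of a single-hidden-layer MLP.
    weights = [W1, b1, W2, b2]. Any transfer function works here,
    since DRS never differentiates it."""
    W1, b1, W2, b2 = weights
    h = np.tanh(X @ W1 + b1)           # could equally be a step function
    return h @ W2 + b2

def error(weights, X, y):
    """Mean squared error of the network output."""
    return np.mean((mlp_forward(weights, X) - y) ** 2)

def directed_random_search(weights, X, y, steps=5000,
                           step_size=0.1, direction_gain=0.5):
    """Random search with a directed component: successful steps are
    remembered and bias the next random trial in the same direction."""
    direction = [np.zeros_like(w) for w in weights]
    best_err = error(weights, X, y)
    for _ in range(steps):
        noise = [np.random.randn(*w.shape) * step_size for w in weights]
        trial = [w + n + direction_gain * d
                 for w, n, d in zip(weights, noise, direction)]
        trial_err = error(trial, X, y)
        if trial_err < best_err:       # keep the improvement...
            direction = [t - w for t, w in zip(trial, weights)]
            weights, best_err = trial, trial_err
        else:                          # ...or discard it and decay the bias
            direction = [0.5 * d for d in direction]
    return weights, best_err
```

Because only forward passes are required, the same loop can train networks with non-differentiable transfer functions, which is one of the freedoms from backpropagation's constraints mentioned in the abstract.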

Similar Resources

An Incremental Neural Network Construction Algorithm for Training Multilayer Perceptrons

The problem of determining the architecture of a multilayer perceptron, together with the disadvantages of the standard backpropagation algorithm, has directed research towards algorithms that determine not only the weights but also the network structure necessary for learning the data. We propose a Constructive Algorithm with Multiple Operators using Statistical Test (MOST) for determini...
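
The blurb truncates before MOST's details, so the sketch below shows only the generic constructive loop such methods share: start with a minimal hidden layer and grow it while validation performance is insufficient. MOST's multiple operators and statistical tests are not reproduced here; grow_network, target_acc, and the grow-by-one rule are illustrative assumptions.

```python
from sklearn.model_selection import train_test_split
from sklearn.neural_network import MLPClassifier

def grow_network(X, y, max_hidden=50, target_acc=0.95):
    """Generic constructive training loop: determine the structure by
    adding hidden units until validation accuracy is acceptable."""
    X_tr, X_val, y_tr, y_val = train_test_split(
        X, y, test_size=0.25, random_state=0)
    net = None
    for n_hidden in range(1, max_hidden + 1):
        net = MLPClassifier(hidden_layer_sizes=(n_hidden,),
                            max_iter=2000, random_state=0)
        net.fit(X_tr, y_tr)
        if net.score(X_val, y_val) >= target_acc:
            break                      # structure found; stop growing
    return net
```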

Using Multiple Node Types to Improve the Performance of DMP (Dynamic Multilayer Perceptron)

This paper discusses a method for training multi-layer perceptron networks called DMP2 (Dynamic Multi-layer Perceptron 2). The method is based on a divide-and-conquer approach which builds networks in the form of binary trees, dynamically allocating nodes and layers as needed. The focus of this paper is on the effects of using multiple node types within the DMP framework. Simulation results s...
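
The truncated summary does not define the node types, so the fragment below merely illustrates one plausible reading: hidden nodes within a single layer each using a different transfer function. The function name and the particular sigmoid/Gaussian/tanh choices are assumptions for illustration, not DMP2's actual design.

```python
import numpy as np

def mixed_type_layer(X, W, b, node_types):
    """Hidden layer whose nodes may each use a different transfer
    function -- one possible meaning of 'multiple node types'."""
    z = X @ W + b
    out = np.empty_like(z)
    for j, kind in enumerate(node_types):
        if kind == "sigmoid":
            out[:, j] = 1.0 / (1.0 + np.exp(-z[:, j]))
        elif kind == "gaussian":
            out[:, j] = np.exp(-z[:, j] ** 2)
        else:                          # default node type: tanh
            out[:, j] = np.tanh(z[:, j])
    return out
```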

Prospective Hardware Implementation of the CHIR Neural Network Algorithm

I review the recently developed Choice of Internal Representations (CHIR) training algorithm for multi-layer perceptrons, with an emphasis on properties relevant to hardware implementation. A comparison with the common error back-propagation algorithm shows that there are potential advantages in realizing CHIR in hardware...

Investigation of Weight Reuse in Multi-Layer Perceptron Networks for Accelerating the Solution of Differential Equations

Research has shown that training multi-layer perceptron networks to solve ordinary and partial differential equations (DEs) can be accelerated by reusing network weights from a previously solved similar problem. This paper compares weight reuse for two existing methods of defining the network error function. Weight reuse is shown to accelerate training of one ordinary and two partial DEs even f...
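
The mechanics of weight reuse are straightforward to sketch. The snippet below is not the papers' DE solver: real DE training minimizes the residual of the differential equation at collocation points, whereas the stand-in targets here are ordinary function values chosen for illustration. It shows only the reuse step itself, via scikit-learn's warm_start flag.

```python
import numpy as np
from sklearn.neural_network import MLPRegressor

x = np.linspace(0.0, 1.0, 200).reshape(-1, 1)
y_problem_a = np.sin(2.0 * np.pi * x).ravel()    # previously solved problem
y_problem_b = np.sin(2.2 * np.pi * x).ravel()    # similar new problem

# Solve problem A from a fresh random initialization.
net = MLPRegressor(hidden_layer_sizes=(32,), max_iter=5000,
                   warm_start=True, random_state=0)
net.fit(x, y_problem_a)

# Weight reuse: warm_start=True keeps the trained coefficients, so
# fitting on problem B continues from A's weights instead of restarting.
net.set_params(max_iter=1000)          # typically far fewer iterations needed
net.fit(x, y_problem_b)
```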

Wavelet Packet Multi-layer Perceptron for Chaotic Time Series Prediction: Effects of Weight Initialization

We train the wavelet packet multi-layer perceptron neural network (WP-MLP) by backpropagation for time series prediction. Weights in the backpropagation algorithm are usually initialized with small random values. If the random initial weights happen to be far from a good solution, or near a poor local optimum, training may take a long time or get trapped in the local optimum. Proper weight...
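
For contrast, here is what "small random values" versus one widely used "proper" initialization look like in code. The Glorot/Xavier scheme shown is a common fan-in/fan-out scaling, not necessarily the scheme this paper proposes; both function names are illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def init_small_random(n_in, n_out, scale=0.01):
    """The usual default criticized above: small random weights,
    which may start far from any good solution."""
    return rng.normal(0.0, scale, size=(n_in, n_out))

def init_glorot(n_in, n_out):
    """A standard 'proper' scheme: weight range scaled to the layer's
    fan-in and fan-out so early activations neither saturate nor vanish."""
    limit = np.sqrt(6.0 / (n_in + n_out))
    return rng.uniform(-limit, limit, size=(n_in, n_out))
```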

Journal:

Volume:   Issue:

Pages:  -

Publication date: 2001